Proximal Quasi-Newton Methods for Nondifferentiable Convex Optimization

Author

  • R. B. Schnabel
Abstract

In [12], a superlinearly convergent algorithm for minimizing the Moreau–Yosida regularization F was proposed. However, that algorithm makes use of the generalized Jacobian of F, instead of matrices B_k generated by a quasi-Newton formula. Moreover, its line search is performed on the function F rather than on f, which is usually quite expensive. In [29], a modification of the algorithm of [12] is presented and its local convergence properties are studied. The algorithm proposed in this paper uses a quasi-Newton update of B_k, and the line search is done on the function f. The resulting algorithm is globally and superlinearly convergent under suitable conditions, and it is implementable. Numerical results show its good performance on piecewise quadratic optimization problems. Acknowledgements. We are grateful to J. Womersley and two referees for their helpful comments. We thank K. Kiwiel for providing his FORTRAN NDO test problems and related references.
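Since the abstract and several of the related works below revolve around the Moreau–Yosida regularization F of a nonsmooth convex f, a small illustration may help. The sketch below is not the paper's algorithm; it uses f(x) = |x|, whose proximal point has the closed-form soft-thresholding expression, purely to show how F is evaluated through the proximal point and why F is differentiable even though f is not:

```python
import numpy as np

def soft_threshold(x, lam):
    """Proximal point of f(y) = |y| at x: argmin_y |y| + (y - x)^2 / (2*lam)."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def moreau_yosida(x, lam):
    """Moreau-Yosida regularization F(x) = min_y f(y) + ||y - x||^2 / (2*lam),
    for f = |.|, evaluated via its proximal point p(x)."""
    p = soft_threshold(x, lam)
    return abs(p) + (p - x) ** 2 / (2.0 * lam)

# F is smooth with gradient (x - p(x)) / lam, which is what gradient-based
# (e.g. quasi-Newton) outer iterations exploit.
x, lam = 2.0, 1.0
p = soft_threshold(x, lam)
grad_F = (x - p) / lam
```

For |x| ≤ lam the proximal point is 0 and F(x) = x²/(2·lam), so F coincides with the Huber-type smoothing of the absolute value.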


Similar references

Proximal quasi-Newton methods for nondifferentiable convex optimization

This paper proposes an implementable proximal quasi-Newton method for minimizing a nondifferentiable convex function f in R^n. The method is based on Rockafellar's proximal point algorithm and a cutting-plane technique. At each step, we use an approximate proximal point p(x_k) of x_k to define a v_k ∈ ∂_{ε_k} f(p(x_k)) with ε_k ≤ α‖v_k‖, where α is a constant. The method monitors the reduction in the value ...


Quasi-Newton Bundle-Type Methods for Nondifferentiable Convex Optimization

In this paper we provide implementable methods for solving nondifferentiable convex optimization problems. A typical method minimizes an approximate Moreau–Yosida regularization using a quasi-Newton technique with inexact function and gradient values which are generated by a finite inner bundle algorithm. For a BFGS bundle-type method global and superlinear convergence results for the outer ite...


A preconditioning proximal Newton method for nondifferentiable convex optimization

We propose a proximal Newton method for solving nondifferentiable convex optimization. This method combines the generalized Newton method with Rockafellar's proximal point algorithm. At each step, the proximal point is found approximately and the regularization matrix is preconditioned to overcome inexactness of this approximation. We show that such a preconditioning is possible within some ac...


Adaptive Fista

In this paper we propose an adaptively extrapolated proximal gradient method, which is based on the accelerated proximal gradient method (also known as FISTA), however we locally optimize the extrapolation parameter by carrying out an exact (or inexact) line search. It turns out that in some situations, the proposed algorithm is equivalent to a class of SR1 (identity minus rank 1) proximal quas...
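For context, the following is a minimal sketch of the standard FISTA iteration with the usual t_k extrapolation rule, applied to an l1-regularized least-squares problem. It is not the adaptive line-search variant the abstract describes, and A, b, and mu below are illustrative placeholders:

```python
import numpy as np

def fista(A, b, mu, n_iter=200):
    """Standard FISTA for min_x 0.5*||A x - b||^2 + mu*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth gradient
    x = y = np.zeros(A.shape[1])
    t = 1.0
    for _ in range(n_iter):
        grad = A.T @ (A @ y - b)           # gradient of the smooth part at y
        z = y - grad / L                   # forward (gradient) step
        # backward step: prox of mu*||.||_1 is componentwise soft-thresholding
        x_new = np.sign(z) * np.maximum(np.abs(z) - mu / L, 0.0)
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)   # fixed extrapolation rule
        x, t = x_new, t_new
    return x
```

The adaptive variant above replaces the fixed (t_k - 1)/t_{k+1} coefficient with one chosen by an exact or inexact line search at each step.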


Proximal quasi-Newton methods for regularized convex optimization with linear and accelerated sublinear convergence rates

In [19], a general, inexact, efficient proximal quasi-Newton algorithm for composite optimization problems has been proposed and a sublinear global convergence rate has been established. In this paper, we analyze the convergence properties of this method, both in the exact and inexact setting, in the case when the objective function is strongly convex. We also investigate a practical variant of...




Publication date: 1998